# T5 Architecture Optimization
## Rut5 Base Headline Generation

wanderer-msk · Text Generation · Transformers · Other · Downloads: 65 · Likes: 1

A Russian news headline generation model based on the T5 architecture, optimized for short news texts; it generates summary-style headlines of 6 to 11 words.

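As a usage sketch, a model like this can be called through the standard Transformers text-to-text API. The repository id below is an assumption (the listing only gives the author wanderer-msk and the display name); check the author's page for the exact name.

```python
# Minimal sketch: headline generation with a T5-style seq2seq model.
# The repository id is hypothetical; verify it on the author's model page.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "wanderer-msk/rut5_base_headline_generation"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

news_text = "Короткий текст новости для генерации заголовка."
inputs = tokenizer(news_text, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_new_tokens=24, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
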
## Udop Large 512

microsoft · MIT · Image-to-Text · Transformers · Downloads: 193 · Likes: 5

UDOP is a universal document processing model built on the T5 architecture that unifies vision, text, and layout; it is suited to tasks such as document image classification, document parsing, and visual question answering.

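Recent versions of Transformers ship dedicated UDOP classes. The sketch below shows document visual question answering, assuming the checkpoint id is `microsoft/udop-large` and that the processor's built-in OCR (which requires `pytesseract`) is used.

```python
# Minimal sketch: document VQA with UDOP. The checkpoint id is an assumption.
from PIL import Image
from transformers import UdopProcessor, UdopForConditionalGeneration

processor = UdopProcessor.from_pretrained("microsoft/udop-large")
model = UdopForConditionalGeneration.from_pretrained("microsoft/udop-large")

image = Image.open("document_page.png").convert("RGB")  # any scanned page
prompt = "Question answering. What is the date on the document?"
# The processor runs OCR on the image and encodes the prompt together with
# the extracted words and their layout boxes.
encoding = processor(images=image, text=prompt, return_tensors="pt")
output_ids = model.generate(**encoding, max_new_tokens=20)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```
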
## Ptt5 Base Summ

recogna-nlp · MIT · Text Generation · Transformers · Other · Downloads: 853 · Likes: 0

A Brazilian Portuguese abstractive summarization model fine-tuned from PTT5; it produces concise summaries of texts such as news articles.

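The summarization pipeline is one way to call a checkpoint like this. The repository id is an assumption based on this listing, and the same pattern applies to the other recogna-nlp PTT5 summarization checkpoints further down the list.

```python
# Minimal sketch: Brazilian Portuguese abstractive summarization with a PTT5 checkpoint.
# The repository id is an assumption based on this listing.
from transformers import pipeline

summarizer = pipeline("summarization", model="recogna-nlp/ptt5-base-summ")
article = "Texto de uma notícia em português do Brasil que se deseja resumir..."
result = summarizer(article, max_length=80, min_length=20, do_sample=False)
print(result[0]["summary_text"])
```
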
## Title Generator

TusharJoshi89 · Apache-2.0 · Text Generation · Transformers · English · Downloads: 67 · Likes: 3

A t5-small-based model that automatically generates concise titles from lengthy medical research abstracts.

## Flan T5 Large Squad2

sjrhuschlee · MIT · Question Answering · Transformers · English · Downloads: 57 · Likes: 5

An extractive question answering model based on flan-t5-large and fine-tuned on SQuAD 2.0; it handles both answerable and unanswerable questions.

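A sketch of extractive QA with the Transformers question-answering pipeline follows. The repository id is assumed from the listing; `handle_impossible_answer` is the pipeline's standard flag for SQuAD 2.0-style unanswerable questions.

```python
# Minimal sketch: extractive QA over SQuAD 2.0-style questions.
# The repository id is an assumption based on this listing.
from transformers import pipeline

qa = pipeline("question-answering", model="sjrhuschlee/flan-t5-large-squad2")
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="The model was fine-tuned from flan-t5-large on the SQuAD 2.0 dataset.",
    handle_impossible_answer=True,  # allow an empty answer for unanswerable questions
)
print(result["answer"], result["score"])
```
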
## Long Ke T5 Base Translation Aihub Bidirection

KETI-AIR-Downstream · Apache-2.0 · Machine Translation · Transformers · Multilingual · Downloads: 79 · Likes: 3

A multi-domain Korean-English bidirectional translation model fine-tuned from KETI-AIR/long-ke-t5-base, supporting translation in the food, technology, socio-technology, and colloquial domains.

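A translation sketch follows. Both the repository id and the direction prefix are assumptions; consult the model card for the exact prompt convention used to select the translation direction.

```python
# Minimal sketch: Korean -> English translation with a long-ke-t5 checkpoint.
# Repository id and the "ko2en:" direction prefix are assumptions; check the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "KETI-AIR-Downstream/long-ke-t5-base-translation-aihub-bidirection"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "ko2en: 오늘 서울의 날씨는 맑고 따뜻합니다."  # hypothetical direction prefix
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
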
## Ptt5 Base Summ Cstnews

recogna-nlp · MIT · Text Generation · Transformers · Other · Downloads: 47 · Likes: 8

A Brazilian Portuguese abstractive text summarization model fine-tuned from PTT5.

## Ptt5 Base Summ Temario

recogna-nlp · MIT · Text Generation · Transformers · Other · Downloads: 159 · Likes: 1

A PTT5-based model fine-tuned to generate abstractive summaries of Brazilian Portuguese texts.

## Ptt5 Base Summ Wikilingua

recogna-nlp · MIT · Text Generation · Transformers · Other · Downloads: 38 · Likes: 4

A Brazilian Portuguese abstractive summarization model fine-tuned from PTT5, supporting a range of summarization tasks.

## Scifive Large Pubmed

razent · Large Language Model · English · Downloads: 63 · Likes: 2

SciFive is a text-to-text model based on the T5 architecture and designed for biomedical literature, focusing on text tasks in the biomedical domain.

## T5 Base Qg Hl

valhalla · MIT · Question Answering · Transformers · Downloads: 11.65k · Likes: 12

An answer-aware question generation model trained on the t5-base architecture; it generates questions for answer spans highlighted in the input text.

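The answer-aware input format wraps the answer span in highlight tokens. The sketch below follows that convention; treat the exact prefix and `<hl>` markers as assumptions to verify against the model card.

```python
# Minimal sketch: answer-aware question generation with highlighted answer spans.
# The "generate question:" prefix and <hl> markers are assumed conventions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "valhalla/t5-base-qg-hl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The answer span ("42") is wrapped in <hl> tokens inside the passage.
text = "generate question: <hl> 42 <hl> is the answer to life, the universe and everything."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
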
## T5 Base Japanese Web

megagonlabs · Apache-2.0 · Large Language Model · Transformers · Japanese · Downloads: 4,917 · Likes: 19

A T5 model pretrained on Japanese web text, with byte-fallback support and a 32K vocabulary.

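Byte fallback means characters missing from the 32K SentencePiece vocabulary decompose into raw byte pieces rather than an unknown token. The quick check below assumes the repository id and that the tokenizer loads through `AutoTokenizer`.

```python
# Minimal sketch: inspecting byte fallback in the Japanese T5 tokenizer.
# The repository id is an assumption based on this listing; requires sentencepiece.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("megagonlabs/t5-base-japanese-web")
print(tokenizer.tokenize("日本語ウェブテキストで事前学習されたT5モデルです。"))
# A rare symbol should fall back to byte pieces such as <0xF0> rather than <unk>.
print(tokenizer.tokenize("🦜"))
```
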
## Code Trans T5 Base Code Documentation Generation Java

SEBIS · Large Language Model · Downloads: 22 · Likes: 1

A T5-based model for Java code documentation generation, designed to produce descriptive documentation for Java functions.

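The same seq2seq interface applies to the CodeTrans checkpoints. The sketch below assumes the Java documentation-generation repository id, and the same pattern carries over to the other SEBIS CodeTrans models in this list; note that some CodeTrans cards expect code pre-tokenized with a language-specific tokenizer, which is skipped here for brevity.

```python
# Minimal sketch: generating a description for a Java function with CodeTrans.
# The repository id is an assumption based on this listing.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/code_trans_t5_base_code_documentation_generation_java"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

java_code = "public int add(int a, int b) { return a + b; }"  # toy, not pre-tokenized
inputs = tokenizer(java_code, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_new_tokens=48, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
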
## Ke T5 Large Ko

KETI-AIR · Apache-2.0 · Large Language Model · Korean · Downloads: 17 · Likes: 3

A T5 model pretrained on Korean and English, supporting cross-lingual, knowledge-grounded response generation.

## Code Trans T5 Base Source Code Summarization Python

SEBIS · Text Generation · Downloads: 27 · Likes: 0

A T5-based pretrained model designed to generate functional summaries of Python code.

## Code Trans T5 Base Code Documentation Generation Javascript

SEBIS · Text Generation · Downloads: 26 · Likes: 1

A T5-based model designed to generate documentation descriptions for JavaScript functions.

## Code Trans T5 Base Api Generation

SEBIS · Large Language Model · Downloads: 105 · Likes: 2

A t5-base pretrained model designed for Java API recommendation generation.

## Code Trans T5 Base Source Code Summarization Python Multitask

SEBIS · Large Language Model · Downloads: 57 · Likes: 1

A T5-based pretrained model for Python code summarization, trained with multi-task support.

## Ke T5 Small Ko

KETI-AIR · Apache-2.0 · Large Language Model · Korean · Downloads: 287 · Likes: 1

A T5 model pretrained on Korean and English, suitable for open-domain dialogue systems.

## Code Trans T5 Base Code Documentation Generation Python

SEBIS · Text Generation · Downloads: 144 · Likes: 12

A T5-based model specialized in generating descriptive documentation for Python functions.

## Code Trans T5 Small Code Documentation Generation Javascript

SEBIS · Text Generation · Downloads: 26 · Likes: 0

A t5-small pretrained model designed to generate documentation descriptions for JavaScript functions.

## Code Trans T5 Base Api Generation Transfer Learning Finetune

SEBIS · Large Language Model · Downloads: 18 · Likes: 0

A T5-based pretrained model for Java API recommendation generation, optimized through transfer learning.
